
Support structured outputs #255

Closed

Conversation

@domenkozar (Contributor) commented Aug 12, 2024

Fixes #253

I'm not entirely sure why the whole file is re-added.

@domenkozar force-pushed the response_format_json_schema branch 2 times, most recently from 23c9996 to feb270e on August 12, 2024 at 22:37
@nytopop commented Aug 13, 2024

I'm not entirely sure why the whole file is re-added.

The line endings have changed from CRLF to LF; git diff -R will show the change.
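For context on the -R trick: a line-ending-only change makes a normal diff show the file as fully rewritten, while reversing the sides puts the old CRLF content on the '+' side, where git flags the trailing carriage returns as whitespace errors. A sketch (the file path is illustrative):

```shell
# Reversed diff: the old CRLF lines appear as additions, so their trailing
# carriage returns are typically highlighted (often rendered as ^M in a pager).
git diff -R -- async-openai/src/types/chat.rs
```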

It looks like there's some inconsistency on that front; might be worth a separate issue/PR to clean up:

eric@io ~/src/code/rust/async-openai/async-openai/src/types (response_format_json_schema)> file *
assistant.rs:        ASCII text, with very long lines (502)
assistant_file.rs:   ASCII text
assistant_impls.rs:  ASCII text
assistant_stream.rs: ASCII text
audio.rs:            ASCII text, with very long lines (353), with CRLF line terminators
batch.rs:            ASCII text
chat.rs:             ASCII text, with very long lines (506)
common.rs:           ASCII text, with CRLF line terminators
completion.rs:       Unicode text, UTF-8 text, with very long lines (569), with CRLF line terminators
embedding.rs:        ASCII text, with very long lines (460), with CRLF line terminators
file.rs:             ASCII text, with very long lines (388)
fine_tuning.rs:      ASCII text, with CRLF line terminators
image.rs:            ASCII text, with CRLF line terminators
impls.rs:            ASCII text
message.rs:          ASCII text
message_file.rs:     ASCII text
mod.rs:              ASCII text
model.rs:            ASCII text, with CRLF line terminators
moderation.rs:       ASCII text, with very long lines (344), with CRLF line terminators
run.rs:              ASCII text, with very long lines (358)
step.rs:             ASCII text
thread.rs:           ASCII text, with very long lines (358)
vector_store.rs:     ASCII text
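The CRLF files in the listing above could be normalized in one pass. A minimal sketch, assuming GNU sed and a POSIX shell; the .gitattributes rule is a suggested addition, not something already in the repo:

```shell
# Enforce LF for Rust sources going forward (suggested rule, not yet in the repo).
printf '*.rs text eol=lf\n' >> .gitattributes

# Find text files that still contain a carriage return and strip the
# trailing \r in place. Assumes GNU sed; on macOS use `sed -i ''`.
grep -rlI $'\r' async-openai/src/types | xargs -r sed -i 's/\r$//'
```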

@domenkozar force-pushed the response_format_json_schema branch from feb270e to 6cbbd8d on August 13, 2024 at 09:41
@domenkozar requested a review from nytopop on August 13, 2024 at 09:44
@domenkozar force-pushed the response_format_json_schema branch 2 times, most recently from b02f5c0 to 92430cb on August 13, 2024 at 09:52
@nytopop left a comment

LGTM pending a rustfmt but I'm not a maintainer, so 🤷

@domenkozar force-pushed the response_format_json_schema branch from 92430cb to 2ff445e on August 13, 2024 at 14:36
@dr3s commented Aug 16, 2024

@64bit any chance this can be merged so we can use the new features of the API?

@64bit (Owner) commented Aug 29, 2024

@nytopop thank you for identifying the newline issue. I use non-Windows machines to work on this repo, but there have been contributions from the community on Windows, hence the issue. Thanks for the suggestion; I have created a new issue for it at #259.

@64bit (Owner) commented Aug 29, 2024

@domenkozar a PR is always appreciated.

However, it required a bit more work for the following issues, which are addressed in #257:

  1. While the change may work, it is semantically inaccurate: for example, it cannot prevent you from constructing an object of type text that still carries a json_schema.
  2. In addition, it may lead to issues when an object is serialized and then deserialized and the two end up not being the same (please see "The type of messages in deserialized CreateChatCompletionRequest are all SystemMessage" #216).
  3. No supporting tests; please see https://github.com/64bit/async-openai?tab=readme-ov-file#contributing
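Point 1 above is about making invalid states unrepresentable. A minimal Rust sketch of that design idea; the type and field names are illustrative, not async-openai's actual API:

```rust
// Model the response format as an enum so that the schema travels only
// with the variant that needs it: a Text format physically cannot carry
// a JSON schema, which a flat struct with optional fields cannot enforce.
enum ResponseFormat {
    Text,
    JsonObject,
    JsonSchema { name: String, schema: String },
}

// The wire-level tag for each variant (illustrative helper).
fn format_tag(f: &ResponseFormat) -> &'static str {
    match f {
        ResponseFormat::Text => "text",
        ResponseFormat::JsonObject => "json_object",
        ResponseFormat::JsonSchema { .. } => "json_schema",
    }
}

fn main() {
    let f = ResponseFormat::JsonSchema {
        name: "weather".into(),
        schema: r#"{"type":"object"}"#.into(),
    };
    assert_eq!(format_tag(&f), "json_schema");
    // There is no way to construct ResponseFormat::Text with a schema
    // attached, so the invalid combination is ruled out at compile time.
    assert_eq!(format_tag(&ResponseFormat::Text), "text");
}
```

With serde, such an enum would typically use a tagged representation so it also round-trips unambiguously, which addresses point 2 as well.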

Hence, this PR can be closed. Thank you again!

@64bit closed this Aug 29, 2024
Successfully merging this pull request may close these issues.

Add support for the structured output response format
4 participants